On the coordination of gestures of humanoids
Authors
Abstract
Hand gestures are an important means of expressivity for humanoids. In this paper, by humanoid we understand user-controlled or autonomous virtual humans (VHs) or embodied conversational agents (ECAs) [4] as well as human-like robots [18]. We cover both human-humanoid and humanoid-humanoid interaction. The semantics, the morphology, and the variations in the performance of gestures reflecting cultural, affective, and other characteristics of the speaker [8], as well as general gesture movement laws [6], have been addressed in earlier work. Our focus in this paper is the coordination of hand gestures with external signals. One type of coordination, the alignment of speech-accompanying gestures to speech, has been studied extensively, and different design principles have been formulated and implemented for specific applications with virtual humans [11, 13, 25]. In these cases, the phonological synchrony rule [15] has been taken as the basis, usually resulting in gestures timed to the speech, even if the speech is generated by TTS. An exception is [24], where, in assembly tasks in which a physical manipulation may take a shorter or longer time, the speech is aligned to the manipulative hand gestures. Another domain where two-handed gestures play a role is sign language [9]. Mechanisms for the fast planning of deictic gestures have also been proposed [14]. Our ongoing research extends these works in the following aspects:
• We propose a more general coordination scheme that takes into account external events such as tempo indications or perceived state information about the interlocutor of the ECA.
• We allow coordination requirements to be declared at a low level of granularity, addressing the individual stages of gestures. Such a refined approach makes it possible to perform experiments on, e.g., expressivity and style, and to include timing strategies as a means to fine-tune the gesturing behavior of a humanoid.
• Our main interest is in the reactive scheduling and planning of gestures with reference to an environment that influences their timing.
• We use the (still under development) BML language to formulate scheduling requirements. As BML is meant to become a general-purpose markup language [3], our testing and extension of its constructs contributes to the development of this unifying language.
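The phase-level coordination and BML-based scheduling described in the points above can be illustrated with a small behavior specification. The snippet below is a hedged sketch in the style of the BML drafts, not markup taken from this paper; the element names, the lexeme value, and the sync-point reference syntax are assumptions based on the draft language. It binds only the stroke phase of a pointing gesture to a named point in the speech, leaving the remaining phases (preparation, hold, retraction) for the scheduler to stretch or compress as external events require.

```xml
<!-- Hedged draft-BML sketch; names follow the BML drafts and may
     differ from the finalized standard. -->
<bml id="bml1">
  <speech id="s1">
    <text>Put the red block <sync id="deixis"/> over there.</text>
  </speech>
  <!-- Only the stroke phase is constrained to the speech sync point;
       the scheduler is free to time the other gesture stages. -->
  <gesture id="g1" lexeme="POINT" stroke="s1:deixis"/>
</bml>
```

Because only one sync point is constrained, a reactive scheduler can re-plan the unconstrained phases at runtime, for instance when a perceived change in the interlocutor's state calls for a faster or slower performance.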
Similar Resources
Mental Timeline in Persian Speakers’ Co-speech Gestures based on Lakoff and Johnson’s Conceptual Metaphor Theory
One of the conceptual metaphors that has been introduced is "time as space": time, an abstract concept, is conceptualized through a concrete concept such as space. This conceptualization of time is also reflected in co-speech gestures. In this research, we try to find out what dimension and direction the mental timeline has in co-speech gestures and under the influence of which one of the metaphoric...
Comparison of Plantar Dynamics During Four Sports Gestures in Rugby Players
Background. Four of the most relevant gestures in rugby (RU) are the pass, the tackle, the line out, and the scrum. RU is the third most common contact sport on the planet and, being a fast-paced collision game, carries a high risk of injury. Objectives. To describe and compare plantar dynamics during four sports gestures in rugby players through speed, strength, and balance. Methods. Twen...
Reusable Gestures for Interactive Web Agents
In this paper we present an approach to define reusable gestures for embodied web agents, given according to the H-anim standard for VRML. We identify the dimensions to compare gestures and criteria to circumscribe a set of gestures an avatar should be endowed with. Based on these dimensions and criteria, we propose a uniform way to define a wide range of gestures that may be adapted to particu...
Human Computer Interaction Using Vision-Based Hand Gesture Recognition
With the rapid emergence of 3D applications and virtual environments in computer systems, the need for a new type of interaction device arises, because traditional devices such as the mouse, keyboard, and joystick become inefficient and cumbersome within these virtual environments. In other words, the evolution of user interfaces shapes the change in Human-Computer Interaction (HCI). In...
Publication date: 2007